A secant method for nonsmooth optimization

Authors

  • Adil M. Bagirov
  • Asef Nazari Ganjehlou
Abstract

The notion of a secant for locally Lipschitz continuous functions is introduced, and a new algorithm for the local minimization of nonsmooth, nonconvex functions based on secants is developed. We demonstrate that secants can be used to design an algorithm for finding descent directions of locally Lipschitz continuous functions. This algorithm is then applied to design a minimization method, called a secant method. It is proved that the secant method generates a sequence converging to Clarke stationary points. Numerical results are presented demonstrating the applicability of the secant method to a wide variety of nonsmooth, nonconvex optimization problems. We also compare the proposed algorithm with a bundle method using numerical results.
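The paper's secant method for nonsmooth minimization is a subgradient-based algorithm whose details are not given in the abstract. As a much simpler illustration of the secant idea it builds on, the sketch below shows the classical one-dimensional secant iteration, here applied to a derivative to locate a stationary point; the function names and tolerances are illustrative choices, not the authors' algorithm.

```python
def secant_root(g, x0, x1, tol=1e-10, max_iter=100):
    """Classical secant iteration for solving g(x) = 0.

    Illustrative only: the paper's secant method for nonsmooth,
    nonconvex minimization is a different, subgradient-based scheme.
    Each step replaces the derivative of g with the slope of the
    secant line through the last two iterates.
    """
    for _ in range(max_iter):
        g0, g1 = g(x0), g(x1)
        if abs(g1 - g0) < 1e-15:  # secant slope vanishes; stop
            break
        # Secant step: x2 = x1 - g(x1) * (x1 - x0) / (g(x1) - g(x0))
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Minimize f(x) = x**2 - 2*x by driving its derivative g(x) = 2*x - 2 to zero.
x_star = secant_root(lambda x: 2.0 * x - 2.0, 0.0, 3.0)
```

For this smooth quadratic the iteration lands on the minimizer x = 1 immediately, since the secant of a linear derivative is exact; the nonsmooth setting of the paper requires generalized (Clarke) subdifferential machinery instead.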


Related articles

The modified BFGS method with new secant relation ‎for unconstrained optimization problems‎

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Based on this modified secant relation, we then present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
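The modified secant relation of this paper is not spelled out in the truncated abstract. For context, here is a sketch of the *standard* BFGS update, which enforces the usual secant equation B_{k+1} s_k = y_k using gradient differences only; the paper's modification, which also uses function values, is not shown.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    Enforces the classical secant relation  B_new @ s = y,  where
    s = x_{k+1} - x_k and y = grad f(x_{k+1}) - grad f(x_k).
    Requires the curvature condition y @ s > 0 to preserve positive
    definiteness.  This is the textbook update, not the modified
    relation proposed in the paper above.
    """
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / (y @ s))     # insert observed curvature

# The update satisfies the secant equation by construction:
B_new = bfgs_update(np.eye(2), np.array([1.0, 0.0]), np.array([2.0, 1.0]))
```

One can check directly that `B_new @ s` reproduces `y`, which is exactly the secant condition the family of methods in this listing generalizes.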


Constrained Nonlinear Least Squares: A Superlinearly Convergent Projected Structured Secant Method

Numerical solution of nonlinear least-squares problems is an important computational task in science and engineering. Effective algorithms have been developed for solving nonlinear least squares problems. The structured secant method is a class of efficient methods developed in recent years for optimization problems in which the Hessian of the objective function has some special structure. A pr...


An Improved Local Convergence Analysis for Secant-like Method

We provide a local convergence analysis for a secant-like algorithm for solving nonsmooth variational inclusions in Banach spaces. An existence-convergence theorem and an improvement of the convergence ratio of this algorithm are given under a center-conditioned divided difference and Aubin's continuity concept. Our results compare favorably with related ones obtained in [16].


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


Pii: S0168-9274(98)00080-4

A family of Least-Change Secant-Update methods for solving nonlinear complementarity problems based on nonsmooth systems of equations is introduced. Local and superlinear convergence results for the algorithms are proved. Two different reformulations of the nonlinear complementarity problem as a nonsmooth system are compared, both from the theoretical and the practical point of view. A global a...
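The abstract does not say which two nonsmooth reformulations the paper compares. One widely used example of such a reformulation is the Fischer-Burmeister NCP function, sketched below as an illustration of how a complementarity condition becomes a (nonsmooth) equation; it is offered as background, not as the paper's specific choice.

```python
import numpy as np

def fischer_burmeister(a, b):
    """Fischer-Burmeister NCP function: phi(a, b) = sqrt(a^2 + b^2) - a - b.

    phi(a, b) = 0  if and only if  a >= 0, b >= 0, and a * b = 0,
    so each scalar complementarity condition turns into one equation.
    phi is smooth everywhere except at the origin, giving a nonsmooth
    system of equations overall -- the setting secant-update methods
    for complementarity problems are designed for.
    """
    return np.sqrt(a**2 + b**2) - a - b

# Complementary pairs are roots; non-complementary pairs are not.
on_boundary = fischer_burmeister(0.0, 3.0)   # a = 0, b >= 0  ->  0
interior    = fischer_burmeister(1.0, 1.0)   # a*b != 0       ->  nonzero
```

Stacking one such equation per component of the complementarity problem yields the nonsmooth system to which least-change secant updates can then be applied.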



Journal:

Volume   Issue

Pages  -

Publication year: 2007